Variance estimation in nonparametric regression via the difference sequence method (short title: Sequence-based variance estimation)

Author

  • Lawrence D. Brown
Abstract

Consider a Gaussian nonparametric regression problem having both an unknown mean function and an unknown variance function. This article presents a class of difference-based kernel estimators for the variance function. Optimal convergence rates that are uniform over broad functional classes and bandwidths are fully characterized, and asymptotic normality is also established. We also show that for suitable asymptotic formulations our estimators achieve the minimax rate.

AMS 2000 Subject Classification: 62G08, 62G20.
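The estimators described in the abstract are built from differences of neighboring observations: differencing cancels a smooth mean function, so squared differences carry information about the local noise level, which a kernel smoother then turns into a variance-function estimate. A minimal first-order sketch of this idea (not the authors' general difference-sequence class; the function name and the Gaussian kernel choice are illustrative):

```python
import numpy as np

def diff_variance_estimate(x, y, bandwidth=0.1):
    """Sketch of a first-order difference-based kernel variance estimator.

    Squared half-differences d_i = (y_{i+1} - y_i)^2 / 2 of consecutive
    responses have expectation close to the local variance when the mean
    function is smooth; smoothing them with a Nadaraya-Watson-type kernel
    average yields a variance-function estimate.
    """
    order = np.argsort(x)
    x, y = x[order], y[order]
    d = 0.5 * np.diff(y) ** 2          # pseudo-residuals: E[d_i] ~ sigma^2(x_i)
    mid = 0.5 * (x[:-1] + x[1:])       # evaluation points of the differences

    def estimate(t):
        # Gaussian kernel weights centered at t
        w = np.exp(-0.5 * ((t - mid) / bandwidth) ** 2)
        return np.sum(w * d) / np.sum(w)

    return estimate
```

The bandwidth controls the usual bias-variance trade-off; the paper's contribution is characterizing the rates attainable uniformly over bandwidths and functional classes, which this sketch does not attempt.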


Similar articles

Variance estimation in nonparametric regression via the difference sequence method (short title: Sequence-based variance estimation)

Consider the standard Gaussian nonparametric regression problem. The observations are (xi, yi), where the εi are i.i.d. with finite fourth moment μ4 < ∞. This article presents a class of difference-based kernel estimators for the variance. AMS 2000 Subject Classification: 62G08, 62G20. Keywords and Phrases: Nonparametric regression, Variance estimation, Asymptotic minimaxity. The work o...



Asymptotically optimal differenced estimators of error variance in nonparametric regression

The existing differenced estimators of error variance in nonparametric regression are interpreted as kernel estimators, and some requirements for a "good" estimator of error variance are specified. A new differenced method is then proposed that estimates the errors as the intercepts in a sequence of simple linear regressions and constructs a variance estimator based on the estimated errors. The n...

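A common way that regression intercepts enter differenced variance estimation, in the spirit of the entry above: for an equispaced design, mean squared half-differences at lag k are inflated by a bias term growing with k, so fitting them against the lag and reading off the intercept removes the leading bias. This sketch is an illustration of that general idea, not the cited paper's exact construction (which estimates the errors themselves as intercepts):

```python
import numpy as np

def lag_regression_variance(y, max_lag=20):
    """Sketch: differenced variance estimate via a regression intercept.

    For equispaced designs, s_k = mean((y[k:] - y[:-k])**2) / 2 satisfies
    E[s_k] ~ sigma^2 + c * (k/n)**2, where c depends on the mean function's
    smoothness. Fitting s_k against (k/n)**2 and taking the intercept
    removes the leading mean-function bias.
    """
    n = len(y)
    ks = np.arange(1, max_lag + 1)
    s = np.array([0.5 * np.mean((y[k:] - y[:-k]) ** 2) for k in ks])
    slope, intercept = np.polyfit((ks / n) ** 2, s, 1)
    return intercept
```

Note that this targets a constant error variance, matching the homoscedastic setting of the entry above, unlike the variance-function estimators in the main abstract.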

Derivative estimation based on difference sequence via locally weighted least squares regression

A new method is proposed for estimating derivatives of a nonparametric regression function. By applying the Taylor expansion technique to a derived symmetric difference sequence, we obtain a sequence of approximate linear regression representations in which the derivative is just the intercept term. Using locally weighted least squares, we estimate the derivative in the linear regression model. The ...

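The Taylor-expansion step in the entry above can be made concrete: for a symmetric difference quotient at half-width δ, g(x+δ) − g(x−δ) = 2δ·g′(x) + (δ³/3)·g‴(x) + O(δ⁵), so the quotients are approximately linear in δ² with the derivative as the intercept, and weighting compensates for the quotients' 1/δ² noise variance. A minimal sketch of this idea (function name and weighting choice are illustrative, not the paper's exact estimator):

```python
import numpy as np

def derivative_at(x, y, i, max_lag=100):
    """Sketch: estimate g'(x_i) from symmetric differences.

    By Taylor expansion, (y[i+j] - y[i-j]) / (x[i+j] - x[i-j])
    ~ g'(x_i) + g'''(x_i) * delta_j**2 / 6, so a weighted least-squares
    fit of these quotients against delta_j**2 recovers the derivative
    as the intercept.
    """
    j = np.arange(1, max_lag + 1)
    delta = 0.5 * (x[i + j] - x[i - j])        # half-widths around x_i
    d = (y[i + j] - y[i - j]) / (2.0 * delta)  # symmetric difference quotients
    # Weights proportional to delta^2, since Var(d_j) is proportional
    # to 1 / delta^2 under i.i.d. noise.
    X = np.column_stack([np.ones_like(delta), delta ** 2])
    W = np.diag(delta ** 2)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ d)
    return beta[0]                             # intercept ~ g'(x_i)
```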


Publication year: 2006